ProLanGO: Protein Function Prediction Using Neural Machine Translation Based on a Recurrent Neural Network
With the development of next-generation sequencing techniques, determining protein sequences is fast and cheap, but extracting useful information from those sequences remains slow and expensive because of the limitations of traditional biological experimental techniques. Protein function prediction has therefore been a long-standing challenge in bridging the gap between the huge number of available protein sequences and their known functions. In this paper, we propose a novel method that converts the protein function prediction problem into a language translation problem, from a newly proposed protein sequence language, "ProLan", to a protein function language, "GOLan", and we build a neural machine translation model based on recurrent neural networks to translate the "ProLan" language into the "GOLan" language. We blindly tested our method by participating in the most recent, third Critical Assessment of Function Annotation (CAFA 3) in 2016, and we also evaluated its performance on selected proteins whose functions were released after the CAFA competition. The good performance on the training and testing datasets demonstrates that our newly proposed method is a promising direction for protein function prediction. In summary, we are the first to convert the protein function prediction problem into a language translation problem and to apply a neural machine translation model to protein function prediction.
Comment: 13 pages, 5 figures
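To make the framing concrete, here is a minimal sketch of the translation setup the abstract describes: a protein sequence is tokenized into k-mer "words" (a stand-in for "ProLan") and a GRU encoder-decoder "translates" it into GO-term tokens ("GOLan"). This is not the authors' implementation; the k-mer tokenization, vocabulary sizes, and hyperparameters are illustrative assumptions.

```python
# Sketch of the ProLan -> GOLan translation framing (assumptions, not
# the paper's code): protein k-mers in, GO-term tokens out, via a
# GRU-based encoder-decoder.
import torch
import torch.nn as nn

def to_kmers(sequence: str, k: int = 3) -> list[str]:
    """Split a protein sequence into overlapping k-mer 'words'."""
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

class Seq2SeqNMT(nn.Module):
    def __init__(self, src_vocab: int, tgt_vocab: int, hidden: int = 256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the "ProLan" k-mer tokens into a summary state.
        _, state = self.encoder(self.src_emb(src_ids))
        # Decode "GOLan" (GO-term) tokens conditioned on that state,
        # teacher-forced on the shifted target sequence.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.out(dec_out)  # logits over the GO-term vocabulary

# Toy usage: tokenize a short (hypothetical) protein fragment into
# 3-mers and run it through an untrained model.
kmers = to_kmers("MKTAYIAKQR")                 # ['MKT', 'KTA', ...]
vocab = {w: i for i, w in enumerate(sorted(set(kmers)))}
src = torch.tensor([[vocab[w] for w in kmers]])
tgt = torch.zeros(1, 4, dtype=torch.long)      # 4 GO-term slots
model = Seq2SeqNMT(src_vocab=len(vocab), tgt_vocab=500)
logits = model(src, tgt)                       # shape (1, 4, 500)
```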
Semi-Autoregressive Neural Machine Translation
Existing approaches to neural machine translation are typically autoregressive models. While these models attain state-of-the-art translation quality, they suffer from low parallelizability and are therefore slow when decoding long sequences. In this paper, we propose a novel model for fast sequence generation: the semi-autoregressive Transformer (SAT). The SAT keeps the autoregressive property globally but relaxes it locally, and is thus able to produce multiple successive words in parallel at each time step. Experiments conducted on English-German and Chinese-English translation tasks show that the SAT achieves a good balance between translation quality and decoding speed. On WMT'14 English-German translation, the SAT achieves a 5.58× speedup while maintaining 88% of the translation quality, significantly better than previous non-autoregressive methods. When producing two words at each time step, the SAT is almost lossless (only 1% degradation in BLEU score).
Comment: EMNLP 2018
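The decoding-time win comes from emitting K consecutive words per step, so a length-T output needs only about T/K sequential steps. Below is a hedged sketch of group-wise greedy decoding; `DummySATHead` is a hypothetical stand-in for the SAT decoder (the actual model is a Transformer with a relaxed causal attention mask), and all names, shapes, and hyperparameters are assumptions, not the paper's implementation.

```python
# Sketch of semi-autoregressive greedy decoding with group size K.
import torch
import torch.nn as nn

class DummySATHead(nn.Module):
    """Hypothetical stand-in for a SAT decoder: maps a decoded prefix
    to logits for the next K positions in one shot. The real SAT does
    this with a relaxed causal mask inside a Transformer; this stub
    only mimics the interface."""
    def __init__(self, vocab: int = 100, K: int = 2, hidden: int = 32):
        super().__init__()
        self.K, self.vocab = K, vocab
        self.emb = nn.Embedding(vocab, hidden)
        self.proj = nn.Linear(hidden, K * vocab)

    def forward(self, ys, memory):
        h = self.emb(ys).mean(dim=1)          # crude prefix summary
        return self.proj(h).view(-1, self.K, self.vocab)

@torch.no_grad()
def sat_decode(model, memory, bos_id=1, eos_id=2, K=2, max_len=16):
    """Greedy decoding that appends K tokens per sequential step,
    so the loop runs ~max_len/K times instead of max_len times."""
    ys = torch.tensor([[bos_id]])             # decoded prefix, shape (1, t)
    while ys.size(1) < max_len:
        next_k = model(ys, memory).argmax(-1)  # (1, K) tokens in parallel
        ys = torch.cat([ys, next_k], dim=1)    # one step, K new words
        if (next_k == eos_id).any():
            break
    return ys

print(sat_decode(DummySATHead(), memory=None))
```

With K = 2, as in the near-lossless setting quoted above, the loop performs roughly half as many sequential decoder passes as token-by-token decoding, which is where the reported speedup comes from.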